Approximation Algorithms for Sparse Best Rank-1 Approximation to Higher-Order Tensors
The sparse tensor best rank-1 approximation problem (BR1Approx), a sparsity
generalization of the dense tensor BR1Approx and a higher-order extension of
the sparse matrix BR1Approx, is one of the most important problems in sparse
tensor decomposition and in related applications arising from statistics and
machine learning. By exploiting the multilinearity as well as the sparsity
structure of the problem, four approximation algorithms are proposed; they are
easy to implement, of low computational complexity, and can serve as
initialization procedures for iterative algorithms. In addition, theoretically
guaranteed worst-case approximation lower bounds are proved for all four
algorithms. Numerical experiments on synthetic and real data illustrate the
effectiveness of the proposed algorithms.
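The abstract above does not spell out the algorithms, but the combination it names (exploiting multilinearity, then enforcing sparsity) can be illustrated with a minimal sketch: higher-order power iterations produce a dense rank-1 approximation, and hard truncation to the k largest-magnitude entries of each factor imposes sparsity. The function name and the truncation-after-iteration ordering are assumptions for illustration, not the paper's actual procedures.

```python
import numpy as np

def sparse_rank1_approx(T, k, iters=20):
    """Hedged sketch of sparse best rank-1 approximation for a 3rd-order
    tensor T: dense rank-1 factors via higher-order power iterations,
    then hard truncation to the k largest-magnitude entries per factor."""
    I, J, K = T.shape
    # Uniform initialization (a random init is customary for general tensors).
    u = np.ones(I) / np.sqrt(I)
    v = np.ones(J) / np.sqrt(J)
    w = np.ones(K) / np.sqrt(K)
    for _ in range(iters):
        # Each update contracts T against the other two factors (multilinearity).
        u = np.einsum('ijk,j,k->i', T, v, w); u /= np.linalg.norm(u)
        v = np.einsum('ijk,i,k->j', T, u, w); v /= np.linalg.norm(v)
        w = np.einsum('ijk,i,j->k', T, u, v); w /= np.linalg.norm(w)

    def truncate(x, k):
        # Keep only the k largest-magnitude entries, then renormalize.
        y = np.zeros_like(x)
        idx = np.argsort(np.abs(x))[-k:]
        y[idx] = x[idx]
        n = np.linalg.norm(y)
        return y / n if n > 0 else y

    u, v, w = truncate(u, k), truncate(v, k), truncate(w, k)
    # Best scale for the fixed (now sparse) unit factors.
    sigma = np.einsum('ijk,i,j,k->', T, u, v, w)
    return sigma, u, v, w
```

On a tensor that is exactly rank-1 with k-sparse factors, the truncation step recovers the support and the reconstruction error is essentially zero; on general tensors the sketch only gives an approximation, consistent with its intended role as an initialization.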
Practical Approximation Algorithms for ℓ1-Regularized Sparse Rank-1 Approximation to Higher-Order Tensors
Two approximation algorithms are proposed for ℓ1-regularized sparse rank-1
approximation to higher-order tensors. The algorithms are based on multilinear
relaxation and sparsification; both are easy to implement and scale well, and
the second scales linearly with the size of the input tensor. Based on a
careful estimation of the ℓ1-regularized sparsification, theoretical
approximation lower bounds are derived. The theoretical results also suggest
an explicit way of choosing the regularization parameters. Numerical examples
are provided to verify the effectiveness of the proposed algorithms.